What are the optimal biases to overcome?
This is a bonus post for my series Raw Nerve. It originally appeared in somewhat different form on Less Wrong.
I’ve noticed that some people have complimented my series Raw Nerve by saying it’s a great explanation of cognitive biases. Which always amuses me, since the series grew out of frustrations I had with the usual way that term gets used. There’s a group of people (call them the cognitive bias community) who say the way to be more rational — to get better at making decisions that get you what you want — is to work at overcoming your biases. But if you’re overcoming biases, surely there are some lessons that will help you more than others.
You might start with the most famous ones, which tend to be the ones popularized by Kahneman and Tversky. But K&T were academics. They weren’t trying to help people be more rational, they were trying to prove to other academics that people were irrational. The result is that they focused not on the most important biases, but the ones that were easiest to prove.
Take their famous anchoring experiment, in which they showed the spin of a roulette wheel affected people’s estimates of how many African countries are in the UN. The idea wasn’t that roulette wheels causing biased estimates was a huge social problem; it was that no academic could possibly argue that this behavior was somehow rational. They thereby scored a decisive blow for psychology against economists claiming we’re just rational maximizers.
Most academic work on irrationality has followed in K&T’s footsteps. And, in turn, much of the stuff done by the wider cognitive bias community has followed in the footsteps of this academic work. So it’s not hard to believe that cognitive bias types are good at avoiding these biases and thus do well on the psychology tests for them. (Indeed, many of the questions on these tests for rationality come straight from K&T experiments!)
But if you look at the average person and ask why they aren’t getting what they want, very rarely do you conclude their biggest problem is that they’re suffering from anchoring, framing effects, the planning fallacy, commitment bias, or any of the other stuff in these tests. Usually their biggest problems are far more quotidian and commonsensical, like procrastination and fear.
One of the things that struck me was watching Eliezer Yudkowsky, one of the most impressive writers on the topic of cognitive biases, try to start a new nonprofit. For years, the organization he founded struggled until recently, when Luke Muehlhauser was named executive director. Eliezer readily agrees that Luke has done more to achieve Eliezer’s own goals for the organization than Eliezer ever did.
But why? Why is Luke so much better at getting what Eliezer wants than Eliezer is? It’s surely not because Luke is so much better at avoiding the standard cognitive biases! Luke often talks about how he’s constantly learning new rationality techniques from Eliezer.
No, it’s because Luke did what seems like common sense: he bought a copy of Nonprofits for Dummies and did what it recommends. As Luke himself says, it wasn’t lack of intelligence or resources or willpower that kept Eliezer from doing these things, “it was a gap in general rationality.”
So if you’re interested in closing the gap, it seems like the skills to prioritize aren’t things like commitment effect and the sunk cost fallacy, but stuff like “figure out what your goals really are”, “look at your situation objectively and list the biggest problems”, “when you’re trying something new and risky, read the For Dummies book about it first”, etc. That’s the stuff I’m interested in writing about.
You should follow me on twitter here.
August 29, 2012